Multi-sensory media experiences
The way we experience the world is based on our five senses, which provide unique and often surprising sensations of our environment. Interactive technologies mainly stimulate our senses of vision and hearing, partly our sense of touch, while the senses of taste and smell remain widely under-exploited. There is, however, a growing international interest from the film, video, and game industries in more immersive viewing and gaming experiences. In the 20th century, the demand for a controllable way to describe colours initiated intense research on colour description and contributed substantially to advances in computer graphics, image processing, photography, and cinematography. Similarly, the 21st century now demands an investigation of touch, taste, and smell as sensory interaction modalities to enhance media experiences.
Reflection on the design of food systems and experiences for sustainable transformations
The importance of food and technology in modern society is undeniable. Technological advances have revolutionized how we produce, distribute, and prepare food beyond local boundaries, and even how we eat. Eating is one of the most multisensory experiences in everyday life. All five of our senses (i.e., taste, smell, vision, hearing, and touch) are involved. We first eat with our eyes, we can smell the food before we taste it, and then we experience its textures and flavours in our mouth. However, the experience does not stop there. The sounds that come both from the environment in which we are immersed while eating and from our interactions with the food (e.g., chewing) and the utensils we use further influence our eating experiences. In all of this, digital technology plays an increasingly important role, especially through emerging immersive technologies such as virtual and augmented reality (VR/AR). Designing at the intersection of technology and food requires multi-stakeholder commitment and a human experience-centred approach. Furthermore, it is essential to look beyond disciplinary boundaries and account for insights on various levels, including perceptual effects, experiential layers, and technological advancements.
Sour promotes risk-taking: an investigation into the effect of taste on risk-taking behaviour in humans
Taking risks is part of everyday life. Some people actively pursue risky activities (e.g., jumping out of a plane), while others avoid any risk (e.g., people with anxiety disorders). Paradoxically, risk-taking is a primitive behaviour that may lead to a happier life by offering a sense of excitement through self-actualization. Here, we demonstrate for the first time that sour, amongst the five basic tastes (sweet, bitter, sour, salty, and umami), promotes risk-taking. Based on a series of three experiments, we show that sour can modulate risk-taking behaviour across two countries (the UK and Vietnam) and across individual differences in risk-taking personality and styles of thinking (analytic versus intuitive). Modulating risk-taking could improve everyday life for a wide range of people.
Mid-air haptic rendering of 2D geometric shapes with a dynamic tactile pointer
An important challenge for ultrasonic mid-air haptics, in contrast to physical touch, is that we lose certain exploratory procedures such as contour following. This makes perceiving geometric properties and identifying shapes more difficult. Meanwhile, the growing interest in mid-air haptics and its application to various new areas requires an improved understanding of how we perceive specific haptic stimuli, such as icons and control dials rendered in mid-air. We address this challenge by investigating static and dynamic methods of displaying 2D geometric shapes in mid-air. We display a circle, a square, and a triangle, in either a static or a dynamic condition, using ultrasonic mid-air haptics. In the static condition, the shapes are presented as a full outline in mid-air, while in the dynamic condition a tactile pointer is moved around the perimeter of the shapes. We measure participants' accuracy and confidence in identifying shapes in two controlled experiments (n1 = 34, n2 = 25). Results reveal that people recognise shapes significantly more accurately, and with higher confidence, in the dynamic condition. We also find that representing polygons as a set of individually drawn haptic strokes, with a short pause at the corners, drastically enhances shape recognition accuracy. Our research supports the design of mid-air haptic user interfaces in application scenarios such as in-car interactions or assistive technology in education.
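The dynamic condition described above, a tactile pointer traced along the shape's perimeter with a brief dwell at each corner, can be sketched as a simple focal-point sampler. The function and all parameters (pointer speed, corner pause, sample rate) are illustrative assumptions, not values or code from the study:

```python
import math

def polygon_pointer_path(vertices, speed=0.2, corner_pause=0.1, rate=200):
    """Sample a focal-point path that traces a polygon's perimeter.

    vertices: list of (x, y) corner positions in metres.
    speed: pointer speed along each edge in m/s (illustrative value).
    corner_pause: dwell time at each corner in s (illustrative value).
    rate: samples per second sent to the haptic device.
    Returns a list of (t, x, y) samples.
    """
    samples, t = [], 0.0
    n = len(vertices)
    for i in range(n):
        (x0, y0), (x1, y1) = vertices[i], vertices[(i + 1) % n]
        # Dwell at the corner so the change of direction is perceivable.
        for _ in range(int(corner_pause * rate)):
            samples.append((t, x0, y0))
            t += 1.0 / rate
        # Move the focal point along the edge at constant speed.
        length = math.hypot(x1 - x0, y1 - y0)
        steps = max(1, int(length / speed * rate))
        for k in range(steps):
            f = k / steps
            samples.append((t, x0 + f * (x1 - x0), y0 + f * (y1 - y0)))
            t += 1.0 / rate
    return samples

# Trace a 6 cm square centred on the palm.
square = [(-0.03, -0.03), (0.03, -0.03), (0.03, 0.03), (-0.03, 0.03)]
path = polygon_pointer_path(square)
```

Each sample would be fed to the ultrasound array as the next focal-point position; the per-corner dwell corresponds to the short pause the study found to enhance recognition of individually drawn strokes.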
I’m sensing in the rain: spatial incongruity in visual-tactile mid-air stimulation can elicit ownership in VR users
Major virtual reality (VR) companies are trying to enhance the sense of immersion in virtual environments by implementing haptic feedback in their systems (e.g., Oculus Touch). It is known that tactile stimulation adds realism to a virtual environment. In addition, when users are not limited by wearing any attachments (e.g., gloves), it is possible to create even more immersive experiences. Mid-air haptic technology provides contactless haptic feedback and offers the potential for creating such immersive VR experiences. However, one of the limitations of mid-air haptics resides in the need for freehand tracking systems (e.g., Leap Motion) to deliver tactile feedback to the user's hand. These tracking systems are not accurate, limiting designers' ability to deliver spatially precise tactile stimulation. Here, we investigated an alternative way to convey incongruent visual-tactile stimulation that can create the illusion of a congruent visual-tactile experience while participants experience the rubber hand illusion in VR.
Measuring the added value of haptic feedback
While there is an increased appreciation for integrating haptic feedback with audio-visual content, there is still a lack of understanding of how to quantify the added value of touch for a user's experience (UX) of multimedia content. Here we focus on three main concepts to measure this added value: UX, emotions, and expectations. We present a case study measuring the added value of haptic feedback for a standardized set of audio-visual content (i.e., short video clips), comparing two haptic stimulation modalities (i.e., mid-air vs. vibrotactile stimuli). Our findings demonstrate that the UX of haptically enhanced audio-visual content is perceived as more pleasant, unpredictable, and creative. Users' overall liking increases together with a positive change in their expectations, independently of the haptic stimulation modality. We discuss how our approach provides the foundation for future work on developing a measurement model to predict the added value of haptic feedback for users' experiences within and beyond the multimedia context.
Integrating mid-air haptics into movie experiences
"Seeing is believing, but feeling is the truth." This idiom from the seventeenth-century English clergyman Thomas Fuller gains new momentum in light of the increased proliferation of haptic technologies that allow people to have various kinds of "touch" and "touchless" interactions. Here, we report on the process of creating and integrating touchless feedback (i.e., mid-air haptic stimuli) into short movie experiences (i.e., a one-minute movie format). Based on a systematic evaluation of users' experiences of those haptically enhanced movies, we show evidence for a positive effect of haptic feedback during the first viewing experience, and also for a repeated viewing after two weeks. This opens up a promising design space for content creators and researchers interested in sensory augmentation of audio-visual content. We discuss our findings and the use of mid-air haptic technologies with respect to their effect on users' emotions, changes in the viewing experience over time, and the effects of synchronisation.
Creating an illusion of movement between the hands using mid-air touch
Apparent tactile motion (ATM) has been shown to occur across many contiguous parts of the body, such as the fingers, forearms, and back. More recently, the illusion has also been elicited on non-contiguous parts of the body, such as from one hand to the other, whether or not the hands are interconnected by an object between them. Here we explore the reproducibility of the intermanual tactile illusion of movement between two free hands by employing mid-air tactile stimulation. We investigate the optimal parameters for generating continuous and smooth motion using two arrays of ultrasound speakers and two stimulation techniques (i.e., static vs. dynamic focal point). In the first experiment, we investigate the occurrence of the illusion when using a static focal point and define a perceptive model. In the second experiment, we examine the illusion using a dynamic focal point, defining a second perceptive model. Finally, we discuss the differences between the two techniques.
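The static focal point technique can be sketched as a pair of overlapping amplitude envelopes, one per hand: apparent tactile motion classically arises when the offset of one stimulus overlaps the onset of the next. This is a minimal sketch under assumptions; the function and all timing values are illustrative, not taken from the paper:

```python
def atm_envelopes(duration=0.4, overlap=0.12, rate=1000):
    """Amplitude envelopes for two static focal points, one per hand,
    whose activity overlaps in time to elicit apparent tactile motion.

    duration: on-time of each focal point in s (illustrative value).
    overlap: time in s during which both points are active (illustrative).
    rate: samples per second.
    Returns a list of (t, amp_left, amp_right) with amplitudes in [0, 1].
    """
    soa = duration - overlap  # stimulus onset asynchrony
    total = soa + duration
    schedule = []
    for i in range(int(total * rate)):
        t = i / rate
        # Left point: fully on, then fading out during the overlap window.
        if t < soa:
            left = 1.0
        elif t < duration:
            left = (duration - t) / overlap
        else:
            left = 0.0
        # Right point: mirror image, fading in during the overlap window.
        if t < soa:
            right = 0.0
        elif t < duration:
            right = (t - soa) / overlap
        else:
            right = 1.0
        schedule.append((t, left, right))
    return schedule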
OSpace: towards a systematic exploration of olfactory interaction spaces
When designing olfactory interfaces, HCI researchers and practitioners have to carefully consider a number of issues, among them scent delivery, detection, and lingering. We present OSpace, an approach for designing, building, and exploring an olfactory interaction space. Our paper is the first to explore in detail not only the scent-delivery parameters but also the issue of air extraction. We conducted a user study to demonstrate how scent detection and lingering times can be acquired under different air extraction conditions, and how the impact of scent type, dilution, and intensity can be investigated. Results show that, with our setup, scents can be perceived by the user within ten seconds and take less than nine seconds to disappear, whether the air extraction is on or off. We discuss the practical application of these results for HCI.
"Touch me": workshop on tactile user experience evaluation methods
In this workshop, we plan to explore the possibilities and challenges of using physical objects and materials to evaluate the user experience (UX) of interactive systems. These objects should address shortfalls of current UX evaluation methods and allow for a qualitative (or even quantitative), playful, and holistic evaluation of UX, without interfering with users' personal experiences during interaction. This provides a tactile enhancement to the purely visual stimulation used in classical evaluation methods. The workshop serves as a basis for networking and community building among interested HCI researchers, designers, and practitioners and should encourage further development of the field of tactile UX evaluation.